Memory-Query Tradeoffs for Randomized Convex Optimization
September 20, 2023 (NSH 3305)

Abstract: We show that any randomized first-order algorithm which minimizes a $d$-dimensional, $1$-Lipschitz convex function over the unit ball must either use $\Omega(d^{2-\delta})$ bits of memory or make $\Omega(d^{1+\delta/6-o(1)})$ queries, for any constant $\delta\in (0,1)$ and when the precision is quasipolynomially small in $d$. Our result implies that cutting plane methods, which use $O(d^2)$ bits of memory and $O(d)$ queries, are Pareto-optimal among randomized first-order algorithms, and that quadratic memory is required to achieve optimal query complexity for convex optimization.

Joint work with Xi Chen.